Differential networks for processing structural dependencies in human language: linguistic capacity vs. memory-based ordering
Authors
Abstract
Surface linear (left-to-right) arrangements of human languages are actually an amalgam of the core language system and systems that are not inherently related to language. It has been widely recognized that an unbounded array of hierarchically structured linguistic expressions is generated by the simplest combinatorial operation, "Merge," and the notion of Merge-generability has been proposed as a key feature that characterizes structural dependencies among elements. Here we tested Merge-generability using a Subject-Predicate matching task, which required both linguistic capacity and short-term memory. We used three types of dependency: Nesting, Crossing, and Grouping as a control. The Nesting dependency is totally Merge-generable, while the Crossing dependency requires some additional processes for memory-based ordering. In order to identify the regions employed for these two dependencies, we directly compared cortical responses to sentence stimuli (with noun phrases and an adverb in the first half of the stimuli, and with verbs in the latter half) using functional magnetic resonance imaging (fMRI), and the following results were obtained. First, in the Nesting – Crossing contrast, significant activations were observed in the bilateral lateral premotor cortices (LPMCs) and inferior frontal gyri, as well as the left middle temporal gyrus and angular/supramarginal gyri, indicating the engagement of syntax-related networks. The Crossing – Nesting contrast showed focal activation in the left fusiform, lingual, and middle occipital gyri (L. FG/LG/MOG). Secondly, signal changes in the LPMCs were well fitted by estimates of the computational costs to search a workspace and select items (numbers of Σ operations). Moreover, signal changes in the L. FG/LG/MOG differentially reflected the loads of ordering elements/words (numbers of Ordering). Thirdly, these fitting models were far more likely than models in which the two estimates were exchanged between the LPMCs and the L. FG/LG/MOG, confirming a double dissociation between the primary effects of Σ and Ordering. In conclusion, these results indicate that separate networks are employed for the two dependencies, and their careful elucidation will provide further insights into these challenges.
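The Nesting/Crossing distinction in the abstract can be made concrete with a small sketch. This is not from the paper's materials; it is a minimal illustration of the standard formal fact that nested dependency patterns (e.g., S1 S2 P2 P1) are recognizable with a stack-like, Merge-generable structure, whereas crossing patterns (e.g., S1 S2 P1 P2) contain arcs that overlap without containment. The function name and arc encoding are illustrative assumptions.

```python
from itertools import combinations

def is_nested(arcs):
    """Return True if no two dependency arcs cross.

    Each arc is a (subject_pos, predicate_pos) index pair over word
    positions. Two arcs cross when one starts strictly inside the
    other but ends strictly outside it.
    """
    for arc_a, arc_b in combinations(arcs, 2):
        a, b = sorted(arc_a), sorted(arc_b)
        if a[0] > b[0]:          # ensure arc a opens first
            a, b = b, a
        if a[0] < b[0] < a[1] < b[1]:  # b opens inside a, closes outside
            return False
    return True

# Nesting (S1 S2 P2 P1): inner arc fully contained in outer arc
print(is_nested([(0, 3), (1, 2)]))   # True
# Crossing (S1 S2 P1 P2): arcs overlap without containment
print(is_nested([(0, 2), (1, 3)]))   # False
# Grouping (S1 P1 S2 P2): disjoint arcs, trivially non-crossing
print(is_nested([(0, 1), (2, 3)]))   # True
```

Under this encoding, the Grouping control pattern also comes out non-crossing, which matches its role in the study as a baseline that imposes neither nested nor crossed long-distance dependencies.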
Similar Resources
Memory-Based Language Processing
ion is introduced. Since MBLP does not abstract over the training data, it is called a lazy learning approach. Rule induction, in contrast, learns rules and does not go back to the actual training data during classification. ∗ A shorter version of this review will be published in German in the journal Linguistische Berichte. Computational Linguistics Volume 32, Number 4 The book consists of 7 c...
Ask Me Anything: Dynamic Memory Networks for Natural Language Processing
Most tasks in natural language processing can be cast into question answering (QA) problems over language input. We introduce the dynamic memory network (DMN), a unified neural network framework which processes input sequences and questions, forms semantic and episodic memories, and generates relevant answers. Questions trigger an iterative attention process which allows the model to condition ...
Dependencies vs. Constituents for Tree-Based Alignment
Given a parallel parsed corpus, statistical treeto-tree alignment attempts to match nodes in the syntactic trees for a given sentence in two languages. We train a probabilistic tree transduction model on a large automatically parsed Chinese-English corpus, and evaluate results against human-annotated word level alignments. We find that a constituent-based model performs better than a similar pr...
RACAI's Natural Language Processing pipeline for Universal Dependencies
This paper presents RACAI’s approach, experiments and results at CoNLL 2017 Shared Task: Multilingual Parsing from Raw Text to Universal Dependencies. We handle raw text and we cover tokenization, sentence splitting, word segmentation, tagging, lemmatization and parsing. All results are reported under strict training, development and testing conditions, in which the corpora provided for the sha...
Journal
Journal title: Frontiers in Psychology
Year: 2023
ISSN: 1664-1078
DOI: https://doi.org/10.3389/fpsyg.2023.1153871